Outline

1. Multilayer Perceptron (MLP) Options
2. Error Functions


1. Multilayer Perceptron (MLP) Options
  a. Train an MLP using backpropagation (BP). Batching, in which weight
     changes are accumulated over portions of the training set before
     being applied, is an option. The learning factor changes adaptively
     (a minimal sketch of batched BP with an adaptive learning factor
     appears after this list).
  b. Fast training of MLP networks. Trains networks one or two
     orders of magnitude faster than BP.
  c. Analyze and prune trained MLPs. The non-demo version produces weight
     and network structure files for the pruned network.
  d. Process data using a trained MLP. Data may or may not include
     desired outputs.
  e. Create MLP subroutine. Given a network structure file and a
     weight file, creates an MLP subroutine in Fortran with a parameter
     list that includes only the input array and output array.
  f. Create formatted weight file. Given a network structure file and a
     weight file, creates a formatted weight file that clearly shows
     the different connections and their weights and thresholds.

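The following Python sketch (an illustration only, not the package's
code) shows the two ideas named in option (a): weight changes are
accumulated over a batch of training patterns before being applied, and
the learning factor is adapted from the change in MSE between epochs.
The network shape, the function name train_mlp, and the specific
adaptation rule (grow the factor on improvement, halve it otherwise) are
assumptions made for the example.

# Illustrative sketch only -- not the package's actual implementation.
# Single-hidden-layer MLP trained by batched backpropagation with a
# simple adaptive learning factor.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(X, T, n_hidden=8, batch_size=16, epochs=100, lr=0.1, seed=0):
    """X: (Npat, Nin) inputs; T: (Npat, Nout) desired outputs."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden));  b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)
    prev_mse = np.inf
    for epoch in range(epochs):
        for start in range(0, len(X), batch_size):
            xb, tb = X[start:start + batch_size], T[start:start + batch_size]
            # Forward pass through one hidden layer.
            h = sigmoid(xb @ W1 + b1)
            o = sigmoid(h @ W2 + b2)
            # Backward pass: gradients of the squared error.
            d_o = (o - tb) * o * (1.0 - o)
            d_h = (d_o @ W2.T) * h * (1.0 - h)
            # Batching: the matrix products below sum the weight changes
            # over every pattern in the batch, and they are applied once.
            W2 -= lr * h.T @ d_o;  b2 -= lr * d_o.sum(axis=0)
            W1 -= lr * xb.T @ d_h; b1 -= lr * d_h.sum(axis=0)
        # MSE over the full training set, as defined in section 2a.
        out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
        mse = np.mean(np.sum((T - out) ** 2, axis=1))
        # Adaptive learning factor (assumed rule): speed up while the
        # error keeps falling, back off sharply when it rises.
        lr = lr * 1.05 if mse < prev_mse else lr * 0.5
        prev_mse = mse
        print(f"epoch {epoch:3d}  MSE {mse:.4f}  learning factor {lr:.4f}")
    return W1, b1, W2, b2
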
2. Error Functions

  a. The error function minimized during backpropagation training and
     fast training is

         MSE = (1/Npat) * SUM_{p=1..Npat} SUM_{k=1..Nout} (Tpk - Opk)^2

     where Npat is the number of training patterns, Nout is the number
     of network output nodes, Tpk is the desired output for the pth
     training pattern and the kth output, and Opk is the actual output
     for the pth training pattern and the kth output. The desired
     output Tpk is 0 for the correct class and 1 for other classes.
     MSE is printed for each iteration.

  b. The error percentage printed during training is

         Err = 100 x (number of patterns misclassified/Npat).

     A short worked example of both MSE and Err appears after this list.
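
To make the two definitions concrete, here is a small worked example in
Python. The three patterns, two output nodes, and the network outputs
are invented for illustration, and since the package's classification
rule is not stated above, the example assumes a pattern's predicted
class is the output node closest to the "correct" target level (the
minimum output, given the 0-for-correct convention stated in 2a).

# Illustrative numbers only: three hypothetical patterns, two output
# nodes, evaluated with the MSE and Err definitions from section 2.
import numpy as np

# Desired outputs T use the stated convention: 0 for the correct class,
# 1 for the other classes.
T = np.array([[0.0, 1.0],    # pattern 1: class 1 is correct
              [1.0, 0.0],    # pattern 2: class 2 is correct
              [0.0, 1.0]])   # pattern 3: class 1 is correct
# Hypothetical network outputs O for the same three patterns.
O = np.array([[0.2, 0.9],
              [0.7, 0.4],
              [0.8, 0.3]])   # pattern 3 comes out on the wrong side

Npat, Nout = T.shape

# MSE = (1/Npat) * SUM_p SUM_k (Tpk - Opk)^2
mse = np.sum((T - O) ** 2) / Npat
print(f"MSE = {mse:.4f}")

# Err = 100 x (number of patterns misclassified/Npat).
# Assumed rule: the predicted class is the output node with the minimum
# value, matching the 0-for-correct target convention.
predicted = np.argmin(O, axis=1)
actual = np.argmin(T, axis=1)
err = 100.0 * np.sum(predicted != actual) / Npat
print(f"Err = {err:.1f}%")   # pattern 3 is misclassified, so Err = 33.3%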